An Immersive VR Therapy Experience


Project Overview
Project Type: Group (6 developers, 10 weeks)
Focus: VR Functionalities
Role: Lead Programmer
Game Engine: Unreal 4
Language: Unreal Blueprint
Description:
Lumino is an immersive VR therapy experience made for Hospice Savannah, aimed at improving quality of
life through guided meditation and tai chi. In a peaceful grassy clearing lit by the northern lights,
players can watch their breath push the grass, flow with the fireflies to create music, and relax.
Features:
▪ Hand Tracking
▪ Jar Grabbing
▪ Responsive Grass Swaying
▪ Firefly Touching
▪ Event Progression
▪ Voice-over System
Feature 01 - Hand Tracking
Our target players are people in hospice care, so hand tracking is useful for those who can't or don't
want to use motion controllers. With hand tracking, players just hold their hands within range of the
tracking cameras.
Hand Tracking 1
Hand Tracking 2
I built a Blueprint for bare-hand tracking (but without collision).
Hand BP Construction
UE supports both hand tracking and motion controller modes, and can intelligently switch between the
two depending on whether the motion controllers are actively in use.
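As a rough C++ illustration of that switching logic (a sketch, not our actual setup; the mesh component names here are hypothetical), a pawn can poll the controller's tracking state each tick:

```cpp
// Sketch: toggle between tracked-hand and controller visuals.
// HandMesh and ControllerMesh are hypothetical components on the pawn.
void AHandPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // UMotionControllerComponent::IsTracked() reports whether the
    // physical controller is currently tracked (i.e. actively in use).
    const bool bControllerActive = MotionController->IsTracked();

    ControllerMesh->SetVisibility(bControllerActive);
    HandMesh->SetVisibility(!bControllerActive);
}
```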
Hand tracking is very sensitive to lighting: if the environment is too dark, it won't work properly.
Hand Tracking in Test

Feature 02 - Collision and Detachment
One of the main interactions in our VR experience is opening a jar, which triggers the move to the next
part of the experience. So simulating the jar opening and closing was the first step for this requirement.
Collision and Detachment 1
Collision and Detachment 2
I simulated the jar body and lid detaching and being placed back (using cubes as placeholders, with
motion controllers).
Hand BP Construction
I created a Blueprint Interface with two BPI functions, Pickup and Drop. The interface can be
implemented by any object that needs to be picked up.
In our case, I implemented it in both the Jar Body and Lid actors.
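For readers more used to C++, the Blueprint Interface corresponds roughly to a UINTERFACE with two BlueprintNativeEvent functions; a minimal sketch (the function names mirror the Blueprint, everything else is assumed):

```cpp
#include "UObject/Interface.h"
#include "PickupInterface.generated.h"

class USceneComponent;

UINTERFACE(BlueprintType)
class UPickupInterface : public UInterface
{
    GENERATED_BODY()
};

// Implemented by anything that can be picked up (here: Jar Body and Lid).
class IPickupInterface
{
    GENERATED_BODY()

public:
    // Called when a hand grabs the object; the hand is passed in so the
    // object can attach itself to it.
    UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Pickup")
    void Pickup(USceneComponent* GrabbingHand);

    // Called when the hand lets go.
    UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Pickup")
    void Drop();
};
```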

Feature 03 - Grass Swaying
We want the player's breath to affect the grass, making it sway based on the direction the player is
facing.
Grass Swaying to the Left
Grass Swaying to the Right
The grass model is just a static mesh. Instead of changing the model's position and rotation, animating
it in the material is a better choice, since physical collision with the grass is not needed in this
project.
Grass with Material
I implemented the grass swaying effect in the material Blueprint, using the World Position Offset node
in the Grass Material BP.
Grass Material BP (World Position Offset)
Grass Material Parameters
I extracted 6 parameters:
1. Variety
2. Shaking Freq
3. Initial Shaking Freq
4. Initial Shaking Amp
5. Wind Intensity Direction X
6. Wind Intensity Direction Y
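The node graph itself isn't reproduced here, but the World Position Offset it computes is essentially a per-vertex sine sway driven by those six parameters. A rough C++ transcription of that math (my reconstruction, not the actual graph; "Phase" stands in for the per-blade variation that Variety introduces):

```cpp
#include "CoreMinimal.h"

// Reconstruction of the sway math driven by the six material parameters.
FVector ComputeGrassOffset(float Time, float Phase, float Variety,
                           float ShakingFreq, float InitialShakingFreq,
                           float InitialShakingAmp, float WindIntensityDirX,
                           float WindIntensityDirY)
{
    // Main sway: a sine wave with a per-blade phase so the blades
    // don't all move in lockstep.
    const float Sway = FMath::Sin(Time * ShakingFreq + Phase * Variety);

    // A faster, smaller "initial" shake layered on top.
    const float Shake = FMath::Sin(Time * InitialShakingFreq) * InitialShakingAmp;

    // Wind intensity/direction scales the offset on X and Y; Z stays 0,
    // so blades bend sideways rather than stretch.
    return FVector((Sway + Shake) * WindIntensityDirX,
                   (Sway + Shake) * WindIntensityDirY,
                   0.0f);
}
```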
Control Parameters in Grass BP
In the Grass BP, I can easily use the "Set Parameter Value" node to adjust the parameters I created earlier.
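The Blueprint's "Set Parameter Value" calls map to UMaterialInstanceDynamic in C++; a minimal sketch (GrassMesh is a hypothetical UStaticMeshComponent member, and the values are examples):

```cpp
// Runtime equivalent of the "Set Parameter Value" nodes in Grass BP.
void AGrassActor::BeginPlay()
{
    Super::BeginPlay();

    // Wrap the grass material in a dynamic instance so its parameters
    // can be changed at runtime.
    UMaterialInstanceDynamic* MID = GrassMesh->CreateDynamicMaterialInstance(0);
    if (MID)
    {
        MID->SetScalarParameterValue(TEXT("Shaking Freq"), 2.0f);
        MID->SetScalarParameterValue(TEXT("Wind Intensity Direction X"), 1.0f);
        MID->SetScalarParameterValue(TEXT("Wind Intensity Direction Y"), 0.0f);
    }
}
```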
The final effect is like this:
Grass Swaying in Test Scene

Feature 04 - Jar Interaction
A new design requirement is that the player can choose to either:
▪ Take the jar (body) with one hand and then open it (lid) with the other hand.
▪ Just open the lid directly with one hand.
Our artists created the jar model and material, so building on "Feature 02 - Collision and Detachment",
I could now replace the placeholders with the real assets.
But the models had a small collision issue when the player wants to open the lid with just one hand.
Adjusted Jar Body Collision
Lid and Body Collisions
I implemented the jar interaction BP with the jar static meshes and materials in the Lumino scene.
I refined the relative locations and collision sizes for accurate interactions, so missed or wrong
grabs are less likely. Taking the jar body before opening the lid also stays optional.
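As an illustration of the kind of tuning involved (all names and sizes here are made up), the lid's collision volume can sit above the body's with a clear gap, so a one-handed reach for the lid only ever overlaps the lid volume:

```cpp
// Hypothetical constructor-time setup for the jar actor.
AJarActor::AJarActor()
{
    Root = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
    SetRootComponent(Root);

    BodyCollision = CreateDefaultSubobject<UBoxComponent>(TEXT("BodyCollision"));
    BodyCollision->SetupAttachment(Root);
    BodyCollision->SetBoxExtent(FVector(6.0f, 6.0f, 8.0f));       // made-up size
    BodyCollision->SetRelativeLocation(FVector(0.0f, 0.0f, 8.0f));

    LidCollision = CreateDefaultSubobject<UBoxComponent>(TEXT("LidCollision"));
    LidCollision->SetupAttachment(Root);
    LidCollision->SetBoxExtent(FVector(5.0f, 5.0f, 2.0f));        // made-up size
    LidCollision->SetRelativeLocation(FVector(0.0f, 0.0f, 19.0f)); // gap above body
}
```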
Jar Interaction on Test
Jar Interaction with Model
Jar Interaction in Scene
Feature 05 - Grab Detection
Until now, the hand interactions were tested using motion controllers.
By default, the motion controller can detect the hand opening and closing, but hand tracking cannot.
So we needed to build our own functions to detect whether the current hand state is "Grab" or "Release".
Release to Grab (in value)
Grab to Release (in value)
I built an algorithm for detecting hand close and open from the hand tracking data.
Detecting Grabbing Algorithm
The algorithm calculates the average distance from all fingertips to the root of the hand, then maps it
into the range 0 to 1.
Release to Grab (in state)
Grab to Release (in state)
I printed the value to the screen to watch how the detection value changes as the hand goes from
release to grab and from grab to release.
Based on that, I used a threshold variable to precisely define the grabbing and releasing states.
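A C++ sketch of the detection math (the bone names and open/closed distances are assumptions; in practice they come from the printed-to-screen calibration above):

```cpp
// Average fingertip-to-root distance, mapped to [0, 1] where 1 means
// fully closed.
float UHandBPC::ComputeGrabValue(const USkinnedMeshComponent* Hand) const
{
    static const FName TipBones[] = {
        TEXT("thumb_tip"), TEXT("index_tip"), TEXT("middle_tip"),
        TEXT("ring_tip"), TEXT("pinky_tip") };               // assumed names

    const FVector Root = Hand->GetBoneLocation(TEXT("hand_root")); // assumed

    float Sum = 0.0f;
    for (const FName& Bone : TipBones)
    {
        Sum += FVector::Dist(Hand->GetBoneLocation(Bone), Root);
    }
    const float AvgDist = Sum / 5.0f;

    // Small average distance = closed hand. OpenDist/ClosedDist are tuned
    // member variables (e.g. 40 cm open, 10 cm closed).
    return FMath::GetMappedRangeValueClamped(
        FVector2D(OpenDist, ClosedDist), FVector2D(0.0f, 1.0f), AvgDist);
}
```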
Hand BP OnGrab & OnRelease
In the Hand BP, OnGrab and OnRelease are two events bound to the events of the same names in BPC Hand.
They are checked every tick in the Hand BPC to detect the grabbing or releasing state.
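A sketch of that tick-side logic: compare the grab value with the threshold and fire the bound events only on state changes (delegate and member names mirror the Blueprints, everything else is assumed):

```cpp
// In the component header (sketch):
//   DECLARE_DYNAMIC_MULTICAST_DELEGATE(FGrabStateSignature);
//   UPROPERTY(BlueprintAssignable) FGrabStateSignature OnGrab;
//   UPROPERTY(BlueprintAssignable) FGrabStateSignature OnRelease;

void UHandBPC::TickComponent(float DeltaTime, ELevelTick TickType,
                             FActorComponentTickFunction* ThisTickFunction)
{
    Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

    const float GrabValue = ComputeGrabValue(HandMesh);
    const bool bNowGrabbing = GrabValue > GrabThreshold; // tuned threshold

    // Broadcast only on transitions, so Hand BP's OnGrab/OnRelease fire
    // once per state change instead of every frame.
    if (bNowGrabbing && !bIsGrabbing)      { OnGrab.Broadcast(); }
    else if (!bNowGrabbing && bIsGrabbing) { OnRelease.Broadcast(); }

    bIsGrabbing = bNowGrabbing;
}
```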
The hand tracking grabbing feature is like this:
Grabbing Jar without Hand Pose
Feature 06 - Hand Pose
To make the jar interaction more realistic, we want the hand pose to look like it is really holding the
jar body or lid while the player grips it, instead of overlapping the object.
Interaction without Hand Pose
Interaction with Hand Pose
My process of adding animations for hand poses:
1. In the Hand Skeletal Mesh, create new poses.
Create New Pose in SM
2. Create a new socket for each pick-up actor.
3. Add a preview asset, and tweak the relative location, rotation, and scale of the hand and the pick-up object.
Add Socket and Preview Asset in SM
4. Apply the new animation sequence for the hand pose to the pick-up actor Blueprint.
Apply Pose to Actor BP
The idea is to use two hand meshes: when the state is Grabbing, show the grabbing hand pose; when the
state is Releasing, show the releasing hand pose.
Hand BP SetHandPose & RestoreTracking
In the Hand BP, the SetHandPose event shows the Poseable Mesh and hides the original Skeletal Mesh,
while the RestoreTracking event does the opposite.
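A C++ sketch of those two events (PoseableMesh and TrackedHandMesh are assumed component members of the Hand BP's class):

```cpp
// Swap in the static grab pose, hide the live tracked hand.
void AHandBP::SetHandPose()
{
    PoseableMesh->SetVisibility(true);
    TrackedHandMesh->SetVisibility(false);
}

// Back to live tracking: the opposite of SetHandPose.
void AHandBP::RestoreTracking()
{
    PoseableMesh->SetVisibility(false);
    TrackedHandMesh->SetVisibility(true);
}
```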
The final effect is like this:
Grabbing Jar with Hand Pose

Feature 07 - Firefly Touching
Another core interaction is firefly touching. The fireflies line up along a trail, and the player can
touch them one by one along it, like doing tai chi movements. Every touch triggers a sound effect
(a piano key), so together the touches form a melody.
Fireflies in Line
Fireflies when touched
Firefly Trail
The fireflies in the line appear in sequence, with delays.
By default, Unreal's for-loop node doesn't support delays, so I created a for-loop macro with a delay.
Macro For Loop with Delay
Use Macro For Loop
With the customized delay for-loop, it's easy to spawn the fireflies in sequence and then spawn the
trail along the firefly spline in the Pathline BP.
I bound a custom destroy-all-fireflies event to the actor's OnDestroyed delegate to control the
lifetime of the firefly group once the player has touched every firefly in it.
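In C++ the delay for-loop maps naturally onto a looping timer; a sketch of the sequential spawn and the OnDestroyed binding (class and member names are assumptions):

```cpp
// Spawn fireflies one per timer tick along the spline, instead of a
// plain for-loop.
void APathlineBP::StartSpawning()
{
    SpawnIndex = 0;

    // When this pathline actor is destroyed, take the whole firefly
    // group with it (the custom event bound to OnDestroyed).
    OnDestroyed.AddDynamic(this, &APathlineBP::DestroyAllFireflies);

    GetWorldTimerManager().SetTimer(
        SpawnTimer, this, &APathlineBP::SpawnNextFirefly,
        /*InRate=*/0.5f, /*bInLoop=*/true); // delay between fireflies
}

void APathlineBP::SpawnNextFirefly()
{
    if (SpawnIndex >= NumFireflies)
    {
        GetWorldTimerManager().ClearTimer(SpawnTimer);
        return;
    }

    // Place each firefly along the spline by distance.
    const float Dist = SpawnIndex * PathSpline->GetSplineLength() / NumFireflies;
    const FVector Loc = PathSpline->GetLocationAtDistanceAlongSpline(
        Dist, ESplineCoordinateSpace::World);

    Fireflies.Add(GetWorld()->SpawnActor<AActor>(
        FireflyClass, Loc, FRotator::ZeroRotator));
    ++SpawnIndex;
}

// Must be a UFUNCTION to bind to the dynamic OnDestroyed delegate.
void APathlineBP::DestroyAllFireflies(AActor* /*DestroyedActor*/)
{
    for (AActor* Firefly : Fireflies)
    {
        if (IsValid(Firefly)) { Firefly->Destroy(); }
    }
    Fireflies.Empty();
}
```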
Fireflies Spawn Events

Feature 08 - Voice-over System
Throughout the experience, a voice-over guides the player through the key events and interactions.
After talking with our sound designer, I proposed merging some scripts into one, because they always
play together in sequence; this makes the overall voice-over system simpler.
Voice-over Scripts
I also discussed a naming convention with the sound designer. We named all sound waves
"DX_number_keywords"; before that, files were confusingly named after whatever would play next.
Voice-over Sound Waves
We used the first prompt (opening the jar) to determine whether the player can interact with the game,
and built two versions of the experience's sequence. If the player doesn't interact, they are no longer
prompted to reach out and touch fireflies, so the experience adapts to the ability of the patient.
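A sketch of how that first prompt could gate the two sequence versions (the sound asset, the timing window, and the function names are all assumptions; ours lived in the Level BP):

```cpp
// Play the "open the jar" prompt, then wait; if the jar was opened in
// time, run the interactive sequence, otherwise the passive one.
void ALuminoLevelScript::PlayOpeningPrompt()
{
    UGameplayStatics::PlaySound2D(this, DX_01_OpenJar); // assumed asset

    GetWorldTimerManager().SetTimer(
        PromptTimer, this, &ALuminoLevelScript::EvaluateInteraction,
        /*InRate=*/30.0f, /*bInLoop=*/false);           // assumed window
}

void ALuminoLevelScript::EvaluateInteraction()
{
    if (bJarOpened)
    {
        StartInteractiveSequence(); // includes the firefly prompts
    }
    else
    {
        StartPassiveSequence();     // no further interaction prompts
    }
}
```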
The overall voice-over scripts as well as interaction events are organized in sequence like this:
Voice-over Sequence in Level BP

Playthrough Test
Reflection
▪ It’s always better to test the technical process beforehand. This is my number one takeaway as a developer on this VR project.
▪ Planning ahead and keeping a prioritized task list are also very important. With limited people and time, development needs a concrete guide, kept up to date.
▪ After testing a key technical point, feedback on and adjustment of the original design are essential, because the feedback is not always positive.
▪ I learned that during development, the framework is as important as, or even more important than, the content that fills it in later. That’s why we need placeholders along the way.
▪ The power of collaboration went far beyond my expectations. Each student on my team has individual talent, and by making use of everyone's abilities we delivered a solid project. Even with the hand tracking problem in the executable build, I think we all did a great job. This development experience will be a memorable part of my life.